Improving on the Independent Metropolis-Hastings Algorithm

Authors

  • Yves F. Atchadé
  • François Perron
Abstract

This paper proposes methods to improve Monte Carlo estimates when the Independent Metropolis-Hastings Algorithm (IMHA) is used. Our first approach uses a control variate based on the sample generated by the proposal distribution. We derive the variance of our estimator for a fixed sample size n and show that, as n tends to infinity, this variance is asymptotically smaller than the one obtained with the IMHA. Our second approach is based on Jensen's inequality. We use a Rao-Blackwellization and exploit the lack of symmetry in the IMHA. An upper bound on the improvements that we can obtain by these methods is derived. AMS Classification: 65C40, 60J22, 60J10.
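As a rough illustration of the setting, the Python sketch below runs an IMHA chain and then adjusts the ergodic average with a control variate built from the proposal draws. The target proportional to exp(-x^4/4 - x^2/2), the N(0, 1.5^2) proposal, the test function f(x) = x^2 and the regression-type control variate h(y) = y^2 are all illustrative assumptions; this is a generic control-variate construction, not the paper's specific estimator or its variance analysis. The idea it shows is that the proposal draws have a known distribution q, so E_q[h(Y)] is available in closed form and the correction term has mean zero.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (assumed) target, known only up to a normalising constant,
# and an independent N(0, 1.5^2) proposal.
log_pi = lambda x: -0.25 * x**4 - 0.5 * x**2
q_scale = 1.5
log_q = lambda x: -0.5 * (x / q_scale) ** 2

def imha(n, x0=0.0):
    """Independent Metropolis-Hastings: every proposal is drawn from q,
    regardless of the current state of the chain."""
    log_w = lambda z: log_pi(z) - log_q(z)     # log importance weight log(pi/q)
    x = x0
    states, proposals = np.empty(n), np.empty(n)
    for i in range(n):
        y = rng.normal(0.0, q_scale)
        if np.log(rng.uniform()) < log_w(y) - log_w(x):
            x = y                              # accept with prob. min(1, w(y)/w(x))
        states[i], proposals[i] = x, y
    return states, proposals

n = 50_000
x, y = imha(n)
f = x**2                                       # estimate E_pi[X^2] from the chain

# Plain IMHA estimate: ergodic average over the chain.
est_plain = f.mean()

# Generic control-variate adjustment using the proposal draws: h(Y) = Y^2 has a
# known mean under q (namely 1.5^2), so beta * (mean(h) - E_q[h]) is a zero-mean
# correction that removes variability correlated with the chain.
h = y**2
cov = np.cov(f, h)
beta = cov[0, 1] / cov[1, 1]
est_cv = est_plain - beta * (h.mean() - q_scale**2)

print(f"plain IMHA estimate of E[X^2]:     {est_plain:.4f}")
print(f"control-variate adjusted estimate: {est_cv:.4f}")
```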

Related articles

Approximating Bayes Estimates by Means of the Tierney Kadane, Importance Sampling and Metropolis-Hastings within Gibbs Methods in the Poisson-Exponential Distribution: A Comparative Study

Here, we work on the problem of point estimation of the parameters of the Poisson-exponential distribution through the Bayesian and maximum likelihood methods based on complete samples. The point Bayes estimates under the symmetric squared error loss (SEL) function are approximated using three methods, namely the Tierney Kadane approximation method, the importance sampling method and the Metrop...

Using parallel computation to improve Independent Metropolis–Hastings based estimation

In this paper, we consider the implications of the fact that parallel raw-power can be exploited by a generic Metropolis–Hastings algorithm if the proposed values are independent from the current value of the Markov chain. In particular, we present improvements to the independent Metropolis–Hastings algorithm that significantly decrease the variance of any estimator derived from the MCMC output...
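Since an independence sampler's proposals never depend on the chain's current state, all of them, together with their importance weights and acceptance uniforms, can be generated up front and therefore in parallel, leaving only a cheap sequential accept/reject sweep. The sketch below illustrates that decoupling with an assumed N(0, 1) target and N(0, 2^2) proposal; it is not the specific parallel estimator developed in that paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# Assumed toy setting: N(0, 1) target (up to a constant) and a wider
# N(0, 2^2) independent proposal.
log_pi = lambda x: -0.5 * x**2
q_scale = 2.0
log_q = lambda x: -0.5 * (x / q_scale) ** 2

n = 100_000
# Because proposals never depend on the chain's current state, all of them
# (and their importance weights and acceptance uniforms) can be generated up
# front, either in one vectorised call as here or farmed out to parallel workers.
y = rng.normal(0.0, q_scale, size=n)
log_w = log_pi(y) - log_q(y)
log_u = np.log(rng.random(size=n))

# Only this accept/reject sweep is inherently sequential, and it is cheap.
x = np.empty(n)
cur, cur_lw = 0.0, log_pi(0.0) - log_q(0.0)
for i in range(n):
    if log_u[i] < log_w[i] - cur_lw:
        cur, cur_lw = y[i], log_w[i]
    x[i] = cur

print(f"IMHA estimate of E_pi[X^2]: {(x**2).mean():.3f}")
```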

Adaptive Independent Metropolis–Hastings

We propose an adaptive independent Metropolis–Hastings algorithm with the ability to learn from all previous proposals in the chain except the current location. It is an extension of the independent Metropolis–Hastings algorithm. Convergence is proved provided a strong Doeblin condition is satisfied, which essentially requires that all the proposal functions have uniformly heavier tails than th...
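As a loose illustration of the adaptive-independence idea (and not the algorithm analysed in that paper), the sketch below periodically refits the location and scale of a Student-t proposal from the history of past proposals, reweighted by their importance weights pi/q so that the fit tracks the target rather than the proposal itself; the t shape keeps the proposal tails heavier than the target's, in the spirit of the tail condition mentioned above. The N(3, 1) target, the refit schedule and the scale floor are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed toy target: N(3, 1), known only up to a constant.
log_pi = lambda x: -0.5 * (x - 3.0) ** 2

def adaptive_imh(n, x0=0.0, t_df=5):
    """Generic adaptive independence sampler (illustration only): the location
    and scale of a Student-t proposal are periodically refit from the history
    of past proposals, reweighted by their importance weights pi/q."""
    def log_q(z, mu, sigma):
        # log density of a location-scale Student-t, up to a constant
        return -0.5 * (t_df + 1) * np.log1p(((z - mu) / sigma) ** 2 / t_df) - np.log(sigma)

    x, mu, sigma = x0, 0.0, 5.0                  # start from a deliberately wide proposal
    states = np.empty(n)
    ys, log_ws = [], []                          # proposal history and their log-weights
    for i in range(n):
        y = mu + sigma * rng.standard_t(t_df)    # independent proposal from the current fit
        lw_y = log_pi(y) - log_q(y, mu, sigma)
        lw_x = log_pi(x) - log_q(x, mu, sigma)
        if np.log(rng.uniform()) < lw_y - lw_x:
            x = y
        states[i] = x
        ys.append(y)                             # adapt from past proposals, not from x
        log_ws.append(lw_y)
        if i > 0 and i % 200 == 0:               # refit the proposal every 200 draws
            w = np.exp(np.array(log_ws) - max(log_ws))
            w /= w.sum()
            arr = np.array(ys)
            mu = float(np.sum(w * arr))
            sigma = max(float(np.sqrt(np.sum(w * (arr - mu) ** 2))), 0.3)
    return states

chain = adaptive_imh(20_000)
print(f"estimated target mean (after burn-in): {chain[2_000:].mean():.3f}")
```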

Improving convergence of the Hastings-Metropolis Algorithm with a learning proposal

The Hastings-Metropolis algorithm is a general MCMC method for sampling from a density known up to a constant. Geometric convergence of this algorithm has been proved under conditions relative to the instrumental distribution (or proposal). We present an inhomogeneous Hastings-Metropolis algorithm for which the proposal density approximates the target density, as the number of iterations increa...

Publication year: 2004